Theoretical and numerical comparison of first order algorithms for cocoercive equations and smooth convex optimization

Authors

Abstract

This paper provides a theoretical and numerical comparison of classical first-order splitting methods for solving smooth convex optimization problems and cocoercive equations. From a theoretical point of view, we compare the convergence rates of the gradient descent, forward-backward, Peaceman-Rachford, and Douglas-Rachford algorithms for minimizing the sum of two functions when one of them is strongly convex. A similar comparison is given in the more general setting of cocoercive equations under the presence of strong monotonicity, and we observe that the rates for optimization are strictly better than the corresponding rates for cocoercive equations for some algorithms. We obtain improved rates with respect to the literature in several instances by exploiting the structure of our problems. Moreover, we indicate which algorithm has the smallest rate depending on the convexity parameters. We verify our results by implementing and comparing the previous algorithms in well-established signal and image inverse problems involving sparsity. We replace the widely used ℓ1 norm by the Huber loss and observe that fully proximal-based strategies have advantages with respect to strategies using gradient steps.
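As a rough illustration (not the authors' implementation) of the forward-backward splitting and the Huber substitution mentioned in the abstract, the sketch below minimizes 0.5*||Ax - b||^2 plus a componentwise Huber penalty; the data A, b and the parameters lam, delta, n_iter are placeholder choices.

```python
import numpy as np

def prox_huber(v, gamma, delta):
    """Proximal operator of gamma * H_delta applied componentwise, where
    H_delta is the Huber function (Moreau envelope of |.| with parameter
    delta) used here in place of the l1 norm."""
    out = v - gamma * np.sign(v)              # linear branch: |v| > delta + gamma
    quad = np.abs(v) <= delta + gamma         # quadratic branch
    out[quad] = v[quad] * delta / (delta + gamma)
    return out

def forward_backward(A, b, lam=0.1, delta=0.01, n_iter=2000):
    """Forward-backward splitting for
    min_x 0.5*||Ax - b||^2 + lam * sum_i H_delta(x_i):
    a gradient (forward) step on the quadratic followed by a proximal
    (backward) step on the Huber penalty."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the smooth gradient
    tau = 1.0 / L                     # step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward step
        x = prox_huber(x - tau * grad, tau * lam, delta)  # backward step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = forward_backward(A, b)
    print("estimated support:", np.flatnonzero(np.abs(x_hat) > 0.1))
```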


Similar articles

Relatively-Smooth Convex Optimization by First-Order Methods, and Applications

The usual approach to developing and analyzing first-order methods for smooth convex optimization assumes that the gradient of the objective function is uniformly smooth with some Lipschitz constant L. However, in many settings the differentiable convex function f(·) is not uniformly smooth – for example in D-optimal design where f(x) := −ln det(HXHᵀ), or even the univariate setting with f(x) ...
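For concreteness (an illustrative assumption, not code from the cited work), a relatively-smooth objective can be minimized by replacing the Euclidean gradient step with a Bregman step; the sketch below uses a Poisson-type objective that is smooth relative to the Burg entropy.

```python
import numpy as np

def bregman_gradient_poisson(A, b, n_iter=200):
    """Bregman (NoLips-style) gradient sketch for
    f(x) = sum_i [(Ax)_i - b_i * log((Ax)_i)], assuming A and b are
    entrywise positive. f is not Lipschitz-smooth, but it is smooth
    *relative to* the Burg entropy h(x) = -sum_i log(x_i) with constant
    L = ||b||_1, so the step size 1/L is admissible."""
    n = A.shape[1]
    x = np.ones(n)
    t = 1.0 / np.sum(b)                  # step size 1/L
    for _ in range(n_iter):
        grad = A.T @ (1.0 - b / (A @ x))
        # Minimizing <grad, x> + (1/t) * D_h(x, x_k) with the Burg entropy
        # gives the closed-form multiplicative update below.
        x = x / (1.0 + t * x * grad)
    return x
```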


First-order methods of smooth convex optimization with inexact oracle

In this paper, we analyze different first-order methods of smooth convex optimization employing inexact first-order information. We introduce the notion of an approximate first-order oracle. The list of examples of such an oracle includes smoothing technique, Moreau-Yosida regularization, Modified Lagrangians, and many others. For different methods, we derive complexity estimates and study the ...
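As a hedged example of one entry in that list, the snippet below turns the Moreau-Yosida regularization of the nonsmooth function f(x) = ||x||_1 into an approximate first-order oracle; the function name and the parameter mu are placeholders.

```python
import numpy as np

def inexact_oracle_l1(x, mu):
    """Approximate first-order oracle for f(x) = ||x||_1 obtained from its
    Moreau-Yosida regularization f_mu. The envelope under-estimates f by at
    most mu/2 per coordinate and its gradient is (1/mu)-Lipschitz, so
    (f_mu(x), grad f_mu(x)) can be fed to a smooth first-order method as
    inexact information about f."""
    p = np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)    # prox_{mu * ||.||_1}(x)
    grad = (x - p) / mu                                 # gradient of the envelope
    val = np.sum(np.abs(p)) + np.sum((x - p) ** 2) / (2.0 * mu)
    return val, grad
```

Shrinking mu trades a smaller oracle error for a larger Lipschitz constant, the same trade-off exploited by the smoothing technique mentioned above.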


Stochastic first order methods in smooth convex optimization

In this paper, we are interested in the development of efficient first-order methods for convex optimization problems in the simultaneous presence of smoothness of the objective function and stochasticity in the first-order information. First, we consider the Stochastic Primal Gradient method, which is nothing else but the Mirror Descent SA method applied to a smooth function and we develop new...
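A minimal sketch (not the paper's algorithm) of such stochastic first-order steps with the Euclidean prox-function, i.e. plain SGD with iterate averaging, on a toy least-squares problem; the oracle, step size, and data are placeholder choices.

```python
import numpy as np

def sgd_with_averaging(grad_sample, x0, step, n_iter, rng):
    """Stochastic gradient steps with a running average of the iterates,
    the Euclidean special case of the mirror-descent SA scheme."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    for k in range(n_iter):
        g = grad_sample(x, rng)          # unbiased estimate of the gradient
        x = x - step * g                 # stochastic gradient step
        avg += (x - avg) / (k + 1)       # running average of the iterates
    return avg

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 20))
    b = A @ rng.standard_normal(20)      # noise-free, consistent system

    def grad_sample(x, rng):
        i = rng.integers(A.shape[0])                      # sample one row
        return A.shape[0] * A[i] * (A[i] @ x - b[i])      # unbiased estimate of A^T(Ax - b)

    step = 1.0 / (A.shape[0] * np.max(np.sum(A ** 2, axis=1)))   # conservative constant step
    x_hat = sgd_with_averaging(grad_sample, np.zeros(20), step, 5000, rng)
    print("residual norm:", np.linalg.norm(A @ x_hat - b))
```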


the study of practical and theoretical foundation of credit risk and its coverage

After examining the effect of each of the factors of industry type, guarantee type, interest rate, inflation rate, country credit risk, fees, recovery, GDP, coverage, and collateral on the credit risk of the Export Guarantee Fund of Iran, it was found that all factors except country credit risk and fees have a significant relationship with credit risk. Moreover, interest rate, inflation rate, recovery, industry type, and country risk have an inverse effect on credit risk, and coverage, colla...


Journal

Journal title: Signal Processing

Year: 2023

ISSN: 0165-1684, 1872-7557

DOI: https://doi.org/10.1016/j.sigpro.2022.108900